Environmental cost


From the telegraph to AI, our communications systems have always had hidden environmental costs

AIHub

When we post to a group chat or talk to an AI chatbot, we don't think about how these technologies came to be. We take it for granted that we can communicate instantly, and we only notice the importance and reach of these systems when they're not accessible. Companies describe these systems with metaphors such as the "cloud" or "artificial intelligence", suggesting something intangible. But they are deeply material.


The Hidden Costs of Translation Accuracy: Distillation, Quantization, and Environmental Impact

Vijay, Dhaathri, Vadapalli, Anandaswarup

arXiv.org Artificial Intelligence

The rapid expansion of large language models (LLMs) has heightened concerns about their computational and environmental costs. This study investigates the trade-offs between translation quality and efficiency by comparing full-scale, distilled, and quantized models using machine translation as a case study. We evaluated performance on the Flores+ benchmark and through human judgments of conversational translations in French, Hindi, and Kannada. Our analysis revealed that the full 3.3B FP32 model, while achieving the highest BLEU scores, incurred the largest environmental footprint (~ 0.007-0.008 kg CO2 per run). The distilled 600M FP32 model reduced inference time by 71-78% and carbon emissions by 63-65% compared with the full model, with only minimal reductions in BLEU scores. Human evaluations further showed that even aggressive quantization (INT4) preserved high levels of accuracy and fluency, with differences between models generally minor. These findings demonstrate that model compression strategies can substantially reduce computational demands and environmental impact while maintaining competitive translation quality, though trade-offs are more pronounced in low-resource settings. We argue for evaluation frameworks that integrate efficiency and sustainability alongside accuracy as central dimensions of progress in NLP.
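The INT4 result above can be made concrete with a minimal sketch of per-tensor symmetric quantization. This is an illustrative scheme, not necessarily the one used in the study; the example weights, the 4-bit range [-7, 7], and the single scale factor are all assumptions:

```python
# Minimal per-tensor symmetric INT4 quantization sketch. This is an
# illustrative scheme, not necessarily the one used in the study.

def quantize_int4(weights):
    """Map floats to signed 4-bit integers in [-7, 7] with one scale."""
    max_abs = max(abs(w) for w in weights)
    scale = max_abs / 7.0 if max_abs > 0 else 1.0
    return [max(-7, min(7, round(w / scale))) for w in weights], scale

def dequantize_int4(codes, scale):
    """Recover approximate float weights from the integer codes."""
    return [c * scale for c in codes]

weights = [0.31, -0.84, 0.02, 0.55, -0.12]   # hypothetical weights
codes, scale = quantize_int4(weights)
recovered = dequantize_int4(codes, scale)
max_err = max(abs(w - r) for w, r in zip(weights, recovered))
# Without clipping, round-trip error is at most half a quantization step.
assert max_err <= scale / 2 + 1e-9
```

Keeping the round-trip error within half a quantization step is one intuition for why the human evaluations above found INT4 output quality largely preserved.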


An Analysis of Optimizer Choice on Energy Efficiency and Performance in Neural Network Training

Almog, Tom

arXiv.org Artificial Intelligence

As machine learning models grow increasingly complex and computationally demanding, understanding the environmental impact of training decisions becomes critical for sustainable AI development. This paper presents a comprehensive empirical study investigating the relationship between optimizer choice and energy efficiency in neural network training. We conducted 360 controlled experiments across three benchmark datasets (MNIST, CIFAR-10, CIFAR-100) using eight popular optimizers (SGD, Adam, AdamW, RMSprop, Adagrad, Adadelta, Adamax, NAdam) with 15 random seeds each. Using CodeCarbon for precise energy tracking on Apple M1 Pro hardware, we measured training duration, peak memory usage, carbon dioxide emissions, and final model performance. Our findings reveal substantial trade-offs between training speed, accuracy, and environmental impact that vary across datasets and model complexity. We identify AdamW and NAdam as consistently efficient choices, while SGD demonstrates superior performance on complex datasets despite higher emissions. These results provide actionable insights for practitioners seeking to balance performance and sustainability in machine learning workflows.
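Emissions figures like those the study reports can be approximated, very roughly, by multiplying runtime, power draw, and grid carbon intensity. The sketch below is not CodeCarbon (the tool the study actually used); the run durations, the 30 W power draw, and the 380 gCO2/kWh grid intensity are invented for illustration:

```python
# Back-of-envelope training emissions: energy = power x time, then
# emissions = energy x grid carbon intensity. All numbers below are
# illustrative assumptions, not measurements from the study.

def training_emissions_g(duration_s, avg_power_w, grid_g_per_kwh):
    """Grams of CO2 for one training run."""
    energy_kwh = avg_power_w * duration_s / 3_600_000  # W*s -> kWh
    return energy_kwh * grid_g_per_kwh

# Hypothetical per-run durations for two optimizers on the same task.
runs_s = {"AdamW": 540, "SGD": 780}
POWER_W = 30.0   # assumed average package power draw
GRID = 380.0     # assumed grid intensity, gCO2 per kWh

for name, secs in runs_s.items():
    print(f"{name}: {training_emissions_g(secs, POWER_W, GRID):.2f} g CO2")
```

Under these assumptions a slower-converging optimizer emits proportionally more CO2, which is the trade-off the study quantifies with real measurements.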


Toward Environmentally Equitable AI

Communications of the ACM

The growing adoption of artificial intelligence (AI) has been accelerating across all parts of society, boosting productivity and addressing pressing global challenges such as climate change. Nonetheless, the technological advancement of AI relies on computationally intensive calculations and thus has led to a surge in resource usage and energy consumption. Even putting aside the environmental toll of server manufacturing and supply chains, AI systems can create a huge environmental cost to communities and regions where they are deployed, including air/thermal pollution due to fossil fuel-based electricity generation and further stressed water resources due to AI's staggering water footprint [12, 25]. To make AI more environmentally friendly and ensure that its overall impacts on climate change are positive, recent studies have pursued multifaceted approaches, including efficient training and inference [5], energy-efficient GPU and accelerator designs [19], carbon forecasting [14], carbon-aware task scheduling [1, 21], green cloud infrastructures [2], sustainable AI policies [10, 18], and more. Additionally, datacenter operators have also increasingly adopted carbon-free energy (such as solar and wind power) and climate-conscious cooling systems, lowering carbon footprint and direct water consumption [8].


The Unbearable Lightness of Prompting: A Critical Reflection on the Environmental Impact of genAI use in Design Education

Lupetti, Maria Luce, Cavallin, Elena, Murray-Rust, Dave

arXiv.org Artificial Intelligence

Design educators are finding ways to support students in skillfully using Generative Artificial Intelligence (genAI) tools in their practices while encouraging critical scrutiny of the ethical and social issues around these technologies. However, the problem of environmental sustainability remains largely unaddressed: there is a lack both of resources for grasping the environmental costs of genAI in education and of shared practices around the issue. This work contributes to filling this gap by counting the energy costs of using genAI in design education and critically reflecting on the impact of these costs. We leverage the image data collected during a genAI workshop for designers, held in 2023 with 49 students, to calculate the energy costs of these types of activities. The results reveal that a genAI workshop for designers can easily double the energy costs associated with students' use of computers, countering the efforts of educational institutions to minimize their energy expenditure. We critically reflect on this finding to distill a set of five alternative stances, with related actions, that can support a conscious use of genAI in design education while respecting individual positions. The work contributes to the field of design pedagogy, and education more broadly, by bringing together ways for educators to reflect on their practices and by informing the future development of educational programs around genAI.
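A rough model makes the "doubling" finding plausible. Every parameter below (images per student, energy per generated image, laptop power, workshop length) is an assumption chosen for illustration, not a figure measured in the paper:

```python
# Rough sanity check of the "workshop can double energy use" claim.
# All parameters are illustrative assumptions, not the paper's data.

STUDENTS = 49              # workshop size, from the abstract above
IMAGES_PER_STUDENT = 60    # assumed
WH_PER_IMAGE = 3.0         # assumed energy per generated image (Wh)
LAPTOP_POWER_W = 25.0      # assumed average laptop draw
WORKSHOP_HOURS = 4.0       # assumed session length

genai_wh = STUDENTS * IMAGES_PER_STUDENT * WH_PER_IMAGE
laptop_wh = STUDENTS * LAPTOP_POWER_W * WORKSHOP_HOURS
print(f"generation: {genai_wh:.0f} Wh, laptops: {laptop_wh:.0f} Wh, "
      f"ratio: {genai_wh / laptop_wh:.2f}")
```

With these assumed numbers, image generation alone costs close to twice the laptops' baseline energy, the same order of magnitude as the paper's finding.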


Towards Environmentally Equitable AI

Hajiesmaili, Mohammad, Ren, Shaolei, Sitaraman, Ramesh K., Wierman, Adam

arXiv.org Artificial Intelligence

Nonetheless, the technological advancement of AI relies on computationally intensive calculations and thus has led to a surge in resource usage and energy consumption. Even putting aside the environmental toll of server manufacturing and supply chains, AI systems can create a huge environmental cost to communities and regions where they are deployed, including air/thermal pollution due to fossil fuel-based electricity generation and further stressed water resources due to AI's staggering water footprint [12, 25]. To make AI more environmentally friendly and ensure that its overall impacts on climate change are positive, recent studies have pursued multi-faceted approaches, including efficient training and inference [5], energy-efficient GPU and accelerator designs [19], carbon forecasting [14], carbon-aware task scheduling [1, 21], green cloud infrastructures [2], sustainable AI policies [10, 18], and more. Additionally, data center operators have also increasingly adopted carbon-free energy (such as solar and wind power) and climate-conscious cooling systems, lowering carbon footprint and direct water consumption [8]. Although these initiatives are encouraging, unfortunately, a worrisome outcome, environmental inequity, has emerged [3]. That is, minimizing the total environmental cost of a globally deployed AI system across multiple regions does not necessarily mean that each region is treated equitably. In fact, the environmental cost of AI is often disproportionately higher in certain disadvantaged regions than in others. Even worse, AI's environmental inequity can be amplified by existing environmental-equity-agnostic resource allocation, load balancing, and scheduling algorithms, and compounded by enduring socioeconomic disparities between regions.


Towards Socially and Environmentally Responsible AI

Li, Pengfei, Liu, Yejia, Yang, Jianyi, Ren, Shaolei

arXiv.org Artificial Intelligence

The sharply increasing sizes of artificial intelligence (AI) models come with significant energy consumption and environmental footprints, which can disproportionately impact certain (often marginalized) regions and hence create environmental inequity concerns. Moreover, concerns with social inequity have also emerged, as AI computing resources may not be equitably distributed across the globe and users from certain disadvantaged regions with severe resource constraints can consistently experience inferior model performance. Importantly, the inequity concerns that encompass both social and environmental dimensions still remain unexplored and have increasingly hindered responsible AI. In this paper, we leverage the spatial flexibility of AI inference workloads and propose equitable geographical load balancing (GLB) to fairly balance AI's regional social and environmental costs. Concretely, to penalize the disproportionately high social and environmental costs for equity, we introduce $L_q$ norms as novel regularization terms into the optimization objective for GLB decisions. Our empirical results based on real-world AI inference traces demonstrate that while the existing GLB algorithms result in disproportionately large social and environmental costs in certain regions, our proposed equitable GLB can fairly balance AI's negative social and environmental costs across all the regions.
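The role of the $L_q$ penalty can be illustrated with a toy two-region example. The unit costs, the penalty weight, and the brute-force grid search below are all assumptions of this sketch, not the paper's actual formulation or algorithm:

```python
# Toy sketch of Lq-norm-regularized geographical load balancing over two
# regions. Unit costs, the penalty weight, and the grid-search solver are
# illustrative assumptions, not the paper's formulation.

def regional_costs(x, unit_costs):
    """Per-region cost when a fraction x of the load goes to region 0."""
    return [unit_costs[0] * x, unit_costs[1] * (1 - x)]

def objective(x, unit_costs, lam, q):
    """Total cost plus an Lq-norm penalty on the per-region costs."""
    costs = regional_costs(x, unit_costs)
    return sum(costs) + lam * sum(c ** q for c in costs) ** (1 / q)

def best_split(unit_costs, lam, q, steps=1000):
    """Brute-force the load split minimizing the regularized objective."""
    return min((i / steps for i in range(steps + 1)),
               key=lambda x: objective(x, unit_costs, lam, q))

UNIT = [1.0, 3.0]  # region 1 assumed 3x costlier per unit of load
x_q1 = best_split(UNIT, lam=5.0, q=1)  # L1 penalty: proportional to total
x_q8 = best_split(UNIT, lam=5.0, q=8)  # high q: penalizes the worst-off region
```

With q=1 the penalty just rescales the total, so all load is sent to the cheap region and that region bears the entire cost; with a high q the penalty approximates the maximum per-region cost, so the optimizer spreads load until no single region's cost is disproportionate.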


Comparative study of microgrid optimal scheduling under multi-optimization algorithm fusion

Duan, Hongyi, Li, Qingyang, Li, Yuchen, Zhang, Jianan, Xie, Yuming

arXiv.org Artificial Intelligence

As global attention on renewable and clean energy grows, the research and implementation of microgrids become paramount. This paper delves into the methodology of exploring the relationship between the operational and environmental costs of microgrids through multi-objective optimization models. By integrating various optimization algorithms like Genetic Algorithm, Simulated Annealing, Ant Colony Optimization, and Particle Swarm Optimization, we propose an integrated approach for microgrid optimization. Simulation results depict that these algorithms provide different dispatch results under economic and environmental dispatch, revealing distinct roles of diesel generators and micro gas turbines in microgrids. Overall, this study offers in-depth insights and practical guidance for microgrid design and operation.
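A toy weighted-sum dispatch illustrates the economic-versus-environmental trade-off the study examines. The fuel-cost and emission coefficients below are invented for illustration and are far simpler than a real microgrid model (which is why the study resorts to metaheuristics):

```python
# Toy two-unit dispatch: meet demand from a diesel generator and a micro
# gas turbine, minimizing w * fuel cost + (1 - w) * emissions. All
# coefficients are illustrative, not from the study.

DEMAND = 100.0                            # kW to supply
FUEL = {"diesel": 0.8, "turbine": 1.2}    # assumed $/kWh
EMIT = {"diesel": 2.6, "turbine": 0.9}    # assumed kgCO2/kWh

def dispatch(w, steps=100):
    """Grid-search the diesel output minimizing the weighted objective."""
    def score(d):                          # d = kW supplied by diesel
        t = DEMAND - d
        cost = FUEL["diesel"] * d + FUEL["turbine"] * t
        emis = EMIT["diesel"] * d + EMIT["turbine"] * t
        return w * cost + (1 - w) * emis
    return min((DEMAND * i / steps for i in range(steps + 1)), key=score)

economic = dispatch(w=1.0)  # pure economic dispatch: all diesel
green = dispatch(w=0.0)     # pure environmental dispatch: all turbine
```

In this linear toy the optimum jumps between corner solutions as the weight changes, mirroring the distinct roles the study observes for diesel generators under economic dispatch and micro gas turbines under environmental dispatch.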


As the AI industry booms, what toll will it take on the environment?

The Guardian

One question that ChatGPT can't quite answer: how much energy do you consume? "As an AI language model, I don't have a physical presence or directly consume energy," it'll say, or: "The energy consumption associated with my operations is primarily related to the servers and infrastructure used to host and run the model." Google's Bard is even more audacious. "My carbon footprint is zero," it claims. Asked about the energy that is consumed in its creation and training, it responds: "not publicly known".


AI is entering an era of corporate control - The Verge

Stanford HAI

Private investment in AI decreased for the first time in a decade. Global private investment in AI had been climbing for years but fell 26.7 percent from 2021, to $91.9 billion in 2022. Training big AI models has environmental costs: a 2022 paper estimates that training the large AI language model BLOOM emitted 25 times as much carbon as flying one passenger from New York to San Francisco and back. By comparison, OpenAI's GPT-3 was estimated to have a carbon cost 20 times that of BLOOM.
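Chaining the two ratios gives a rough absolute scale, if one assumes a ballpark figure of about one tonne of CO2 per passenger for a New York to San Francisco round trip (an assumption here, not a number from the report):

```python
# Chaining the two ratios in the paragraph above. The per-passenger
# flight figure is an assumed ballpark (~1 tCO2), not a measurement.

FLIGHT_T = 1.0            # assumed tCO2, one passenger NY-SF round trip
bloom_t = 25 * FLIGHT_T   # BLOOM estimated at 25x the flight
gpt3_t = 20 * bloom_t     # GPT-3 estimated at 20x BLOOM
print(f"BLOOM ~ {bloom_t:.0f} t CO2, GPT-3 ~ {gpt3_t:.0f} t CO2")
```

Under that assumption the two ratios put BLOOM in the tens of tonnes and GPT-3 in the hundreds of tonnes of CO2.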